
Conversation

hsn10 commented Sep 14, 2014

patch for SPARK-3482

SparkQA commented Sep 14, 2014

Can one of the admins verify this patch?

srowen (Member) commented Sep 14, 2014

Again, still looks like a duplicate of #1875

Review comment on bin/spark-shell (Outdated), from a Member:

You may have to quote these so that dirs with spaces in their names keep working, as they do at the moment. The above should look like:
FWDIR="$(cd "$(dirname "$(readlink -f "$0")")"/..; pwd)"
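The quoting concern can be demonstrated with a short sketch. The directory name with a space below is hypothetical, used only to show why every command substitution in the line above must itself be quoted:

```shell
#!/usr/bin/env bash
# Sketch: a hypothetical install directory containing a space.
# With unquoted substitutions, word splitting would break the path;
# the fully quoted form resolves it correctly.
BASE="$(cd -P "$(mktemp -d)" && pwd)"
mkdir -p "$BASE/spark home/bin"
touch "$BASE/spark home/bin/spark-shell"
SCRIPT="$BASE/spark home/bin/spark-shell"

# Quoted form, as suggested above: resolves the parent directory
# even though the path contains a space.
FWDIR="$(cd "$(dirname "$SCRIPT")"/.. && pwd)"
echo "$FWDIR"
```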

Follow-up review comment (Member):

Of course, this applies to all the other places as well.

roji commented Sep 30, 2014

Note that this PR won't support cases of symlink to symlink to Spark (i.e. multiple redirections). The PR I submitted at some point (#1875) shows how to implement this properly with a loop.

marmbrus (Contributor) commented Dec 2, 2014

Thanks for working on this. However, since it's a duplicate, I think we should close this PR and continue any discussion on #1875.

asfgit closed this in b0a46d8 on Dec 2, 2014
ghost pushed a commit to dbtsai/spark that referenced this pull request Nov 4, 2015
This PR is based on roji's work to support running the Spark scripts from symlinks. Thanks for the great work, roji. Would you mind taking a look at this PR? Thanks a lot.

Distributions like HDP normally expose the Spark executables as symlinks on the `PATH`, but the current Spark scripts cannot recursively resolve a symlink to its real path, so Spark fails to execute when invoked through a symlink. This PR solves the issue by finding the absolute path behind the symlink.

Rather than using `readlink -f` as this PR (apache#2386) did, which is not supported on Mac, the path is resolved manually in a loop.

I've tested on Mac and Linux (CentOS); it looks fine.

This PR does not fix the scripts under the `sbin` folder; I'm not sure whether they need to be fixed as well.

Please help to review; any comment is greatly appreciated.

Author: jerryshao <[email protected]>
Author: Shay Rojansky <[email protected]>

Closes apache#8669 from jerryshao/SPARK-2960.
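The loop-based resolution described in the commit message above can be sketched as follows. This is a minimal sketch, not the actual Spark script, and the helper name `resolve_link` is assumed for illustration:

```shell
#!/usr/bin/env bash
# Follow a chain of symlinks to the real file without `readlink -f`,
# which is unavailable on macOS. Helper name is illustrative only.
resolve_link() {
  local source="$1"
  while [ -h "$source" ]; do              # loop while still a symlink
    local dir
    dir="$(cd -P "$(dirname "$source")" && pwd)"
    source="$(readlink "$source")"
    # A relative link target is interpreted relative to the link's dir.
    case "$source" in
      /*) ;;                              # already absolute
      *)  source="$dir/$source" ;;
    esac
  done
  printf '%s\n' "$source"
}

# A launcher script could then compute its home directory as:
# FWDIR="$(cd "$(dirname "$(resolve_link "$0")")"/.. && pwd)"
```

Because the `while [ -h ... ]` test repeats after each hop, this handles the symlink-to-symlink case that a single `readlink` call misses.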
kiszk pushed a commit to kiszk/spark-gpu that referenced this pull request Dec 26, 2015
